A New Method to Select Training Images in Multi-Point Geostatistics

DOI: 10.4236/ojogas.2018.32010    PP. 112-129

ABSTRACT

Training images are a key modeling parameter in multi-point geostatistics and directly determine the quality of the resulting models, so candidate training images must be evaluated and selected before multi-point geostatistical modeling is applied. The overall repetition probability alone is not sufficient to describe the relationships among single data events in a training image. Based on this understanding, a new method for selecting training images is presented in this paper. The basic idea is to use the repetition probability distribution of single data events to characterize the type and stationarity of the sedimentary patterns in a training image: the mean and deviation of the single-data-event repetition probabilities reflect the stationarity of the geological model, while the rate of data-event mismatching reflects the diversity of geological patterns in the training image. The optimal training image is selected by combining the repetition probabilities of single data events with the overall repetition probability. Simulation tests show that a good training image exhibits high repetition-probability compatibility, a stable distribution of single-data-event repetition probabilities, a low probability mean, a low probability deviation, and a low mismatch rate. The method allows training images to be selected quickly and provides a sound basis for multi-point geostatistical simulations.
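To make the statistics described above concrete, the following is a minimal sketch of how single-data-event repetition probabilities, their mean and deviation, and a mismatch rate could be computed for a 2D categorical training image. The array layout, the square template, the function name data_event_statistics, and the exact definition of the mismatch rate (fraction of data events that occur only once) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def data_event_statistics(ti, template=3):
    """Scan a 2D categorical training image with a square template and
    summarise how often each data event (template-sized facies pattern)
    repeats: mean/deviation of the per-event repetition probabilities
    and the rate of non-repeated ("mismatching") events."""
    rows, cols = ti.shape
    events = []
    for i in range(rows - template + 1):
        for j in range(cols - template + 1):
            patch = ti[i:i + template, j:j + template]
            events.append(tuple(patch.ravel()))   # one data event

    counts = Counter(events)
    total = len(events)
    # Repetition probability of each distinct data event.
    probs = np.array([c / total for c in counts.values()])

    return {
        "mean_prob": probs.mean(),        # lower mean -> richer pattern variety
        "prob_deviation": probs.std(),    # lower deviation -> more stationary patterns
        # Assumed definition: share of data events that never repeat.
        "mismatch_rate": sum(1 for c in counts.values() if c == 1) / len(counts),
    }

# Example: a synthetic 50x50 training image with two facies codes.
rng = np.random.default_rng(0)
ti = (rng.random((50, 50)) > 0.7).astype(int)
print(data_event_statistics(ti))
```

Candidate training images could then be ranked by comparing these statistics, favoring images with a low probability mean, low deviation, and low mismatch rate, in line with the criteria stated in the abstract.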

Share and Cite:

Wang, L., Yin, Y. and Feng, W. (2018) A New Method to Select Training Images in Multi-Point Geostatistics. Open Journal of Yangtze Oil and Gas, 3, 112-129. doi: 10.4236/ojogas.2018.32010.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.