Article Title [English]
Declustering of an earthquake catalog is the process of separating the catalog into foreshocks, main shocks, and aftershocks. Several declustering algorithms have been developed over the years; until now, because of their simplicity and the availability of source code, most users have applied either the algorithm of Gardner and Knopoff (1974) or that of Reasenberg (1985).

In this article, in addition to these traditional methods, we use the stochastic declustering approach introduced by Zhuang et al., based on the Epidemic-Type Aftershock Sequence (ETAS) model. In this model, each earthquake has a computable probability of being a background (spontaneous) event or of having been triggered by a previous event. The basis of the model is a mathematical function that describes the seismic sequences in a given area. We use the R package ETAS, an R implementation (through a C port) of the original Fortran code, produced by Jalilian and Zhuang.

For this study, an earthquake catalog of Hormozgan province, Iran, is extracted. We select earthquake data for the period 1964-2016 from the rectangular geographical region 26-29°N and 53-59°E, take a magnitude threshold of ML = 4.0, and consider shallow events down to a depth of 30 km. The data are extracted from the International Institute of Earthquake Engineering and Seismology (IIEES).

There are many differences between the catalogs declustered by the various methods, so a general conclusion is difficult. The results show that the ETAS model estimates fewer independent earthquakes than the Reasenberg method. The results of the Gardner and Knopoff method, assuming a 90 percent probability, and the Uhrhammer method, assuming a 50 percent probability, were close to each other.
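To make the window-based family of methods concrete, the following is a minimal sketch of declustering in the spirit of Gardner and Knopoff (1974): each event, taken in order of decreasing magnitude, claims a magnitude-dependent space-time window, and smaller events falling inside that window are flagged as dependent. The window coefficients below are the commonly quoted Gardner-Knopoff fits; they, and the simple labeling scheme, are illustrative assumptions, not the exact procedure or parameter values used in this study.

```python
import math

def gk_windows(mag):
    """Approximate Gardner-Knopoff windows: (space in km, time in days).
    Coefficients are the widely cited fits to the original 1974 tables
    (an assumption here, not values taken from this article)."""
    space_km = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        time_days = 10 ** (0.032 * mag + 2.7389)
    else:
        time_days = 10 ** (0.5409 * mag - 0.547)
    return space_km, time_days

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def decluster(events):
    """events: list of dicts with keys t (days), lat, lon, mag.
    Returns a parallel list of labels: 'mainshock' or 'aftershock'
    (foreshocks are lumped with aftershocks as 'dependent' events here)."""
    order = sorted(range(len(events)), key=lambda i: -events[i]["mag"])
    labels = [None] * len(events)
    for i in order:
        if labels[i] is not None:
            continue  # already claimed by a larger event's window
        labels[i] = "mainshock"
        s_km, t_days = gk_windows(events[i]["mag"])
        for j in range(len(events)):
            if j != i and labels[j] is None:
                close_in_time = abs(events[j]["t"] - events[i]["t"]) <= t_days
                close_in_space = haversine_km(
                    events[i]["lat"], events[i]["lon"],
                    events[j]["lat"], events[j]["lon"]) <= s_km
                if close_in_time and close_in_space:
                    labels[j] = "aftershock"
    return labels
```

The stochastic (ETAS) approach discussed above differs precisely in that it replaces this hard in-or-out window with a probability for each event of being background versus triggered.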