In this paper we apply to intact rock the four-parameter polyaxial criterion originally proposed by Ottosen for modelling the peak strength of concrete. The Ottosen criterion is fitted to strength data at both the triaxial compression and triaxial extension stress states, and we use previously published polyaxial data sets to show how this might be done for rock. Additionally, we present a novel relation between the 2D Hoek-Brown (HB) criterion and the Ottosen criterion, and show how this permits the use of the HB parameter m, in place of polyaxial test data, to determine the four parameters that govern the Ottosen criterion. We suggest that the Ottosen criterion is capable of modelling polyaxial rock strength with reasonable accuracy, generally within ±10%.
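The Ottosen criterion evaluated here can be sketched as a failure function of the stress invariants I1, J2 and the Lode angle, with four parameters (a, b, k1, k2). The sketch below uses the standard form of the criterion with tension positive; the parameter values in the test are illustrative placeholders, not fitted values for any rock.

```python
import numpy as np

def ottosen_lambda(cos3t, k1, k2):
    """Lode-angle function lambda(cos 3theta) of the Ottosen criterion."""
    if cos3t >= 0.0:
        return k1 * np.cos(np.arccos(k2 * cos3t) / 3.0)
    return k1 * np.cos(np.pi / 3.0 - np.arccos(-k2 * cos3t) / 3.0)

def ottosen_f(sig, fc, a, b, k1, k2):
    """Evaluate the Ottosen failure function for principal stresses sig
    (tension positive); fc is the uniaxial compressive strength.
    f = 0 on the failure surface, f < 0 inside it."""
    s = np.asarray(sig, dtype=float)
    i1 = s.sum()
    dev = s - i1 / 3.0                     # deviatoric principal stresses
    j2 = 0.5 * (dev ** 2).sum()
    j3 = np.prod(dev)
    # cos 3theta from the deviatoric invariants (defaults to 1 when J2 = 0)
    cos3t = 1.5 * np.sqrt(3.0) * j3 / j2 ** 1.5 if j2 > 0 else 1.0
    cos3t = float(np.clip(cos3t, -1.0, 1.0))
    lam = ottosen_lambda(cos3t, k1, k2)
    return a * j2 / fc ** 2 + lam * np.sqrt(j2) / fc + b * i1 / fc - 1.0
```

Fitting the criterion to triaxial compression and triaxial extension data then amounts to choosing (a, b, k1, k2) so that f = 0 at the measured peak-strength states on both meridians.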
In-situ stress magnitude and orientation have long been analyzed separately using classical statistics, despite the fact that stress is a tensor and should therefore be analyzed using tensor-related methods. In this paper we investigate the applicability of a two-dimensional aleatory model that can handle the stress tensor as a single entity. Firstly, a multivariate normal distribution of tensor components is assumed and the marginal probability functions of the eigen-parameters (principal stresses and rotational angle) are derived. Using published in-situ stress data, random stress tensors are generated to compare the distributions of eigen-parameters obtained using classical and tensor statistics. We conclude that for stress magnitude these two methods give identical results, whereas for orientation only tensor statistics gives the correct result. Additionally, only tensor statistics is invariant with regard to the orientation of the coordinate system. Therefore, we conclude that in practical cases dealing only with stress magnitude either method can give reliable results, but any analysis involving principal stress orientation requires tensor statistics.
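The comparison described above can be sketched as follows: random 2-D stress tensors are drawn from a multivariate normal distribution of the components (sigma_xx, sigma_yy, tau_xy), and the eigen-parameters are then computed either from the mean tensor (tensor statistics) or by averaging the eigen-parameters of each sample (classical statistics). The mean vector and covariance here are illustrative placeholders, not the published data sets used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-D stress state (MPa): components [s_xx, s_yy, t_xy]
mean = np.array([20.0, 10.0, 3.0])
cov = np.diag([4.0, 4.0, 1.0])

samples = rng.multivariate_normal(mean, cov, size=5000)

def eigen_params(sxx, syy, txy):
    """Principal stresses and rotation angle of a 2-D stress tensor."""
    c = 0.5 * (sxx + syy)                          # centre of Mohr circle
    r = np.sqrt((0.5 * (sxx - syy)) ** 2 + txy ** 2)  # radius
    theta = 0.5 * np.arctan2(2.0 * txy, sxx - syy)    # rotation angle (rad)
    return c + r, c - r, theta

# Tensor statistics: average the components first, then decompose
s1_t, s2_t, th_t = eigen_params(*samples.mean(axis=0))

# Classical statistics: decompose each sample, then average
s1_all, s2_all, th_all = eigen_params(samples[:, 0], samples[:, 1], samples[:, 2])
s1_c, s2_c, th_c = np.mean(s1_all), np.mean(s2_all), np.mean(th_all)
```

Because eigendecomposition is nonlinear in the components, the two routes do not coincide in general; comparing the resulting distributions of (s1, s2, theta) is the core of the analysis.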
Many parameters used in rock engineering design are characterized using objectively measured data, although at early design stages there may be insufficient data available (epistemic uncertainty) to define the parameter in question. In such instances, the design process could benefit from the use of Bayesian statistics, where informative priors can be constructed from expert knowledge and then updated as further test data become available. In this paper we use small sample sizes of uniaxial compressive strength as an exemplar parameter exhibiting epistemic uncertainty to show how Bayesian updating takes place. We then use the example of the analysis of rock spalling around an underground opening to show how the results of Bayesian analysis can be used to improve estimates of strength, leading to improved analysis of the probability of failure.
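One minimal way to sketch the updating step is a conjugate normal-normal model for the mean uniaxial compressive strength (UCS), assuming the test-scatter standard deviation is known. The prior and data values below are hypothetical, chosen only to illustrate how a small sample tightens an expert-derived prior.

```python
import numpy as np

def update_normal_mean(prior_mu, prior_sd, data, noise_sd):
    """Conjugate normal-normal update of the mean of a parameter
    (e.g. UCS, MPa), assuming known test-scatter sd noise_sd."""
    n = len(data)
    # Posterior precision is the sum of prior and data precisions
    post_prec = 1.0 / prior_sd ** 2 + n / noise_sd ** 2
    post_var = 1.0 / post_prec
    post_mu = post_var * (prior_mu / prior_sd ** 2 + np.sum(data) / noise_sd ** 2)
    return post_mu, np.sqrt(post_var)

# Hypothetical expert prior: mean UCS 100 MPa, sd 20 MPa
# Hypothetical small sample of three test results (MPa)
post_mu, post_sd = update_normal_mean(100.0, 20.0, [85.0, 95.0, 110.0], 15.0)
```

Each new batch of test results can be fed through the same update, with the previous posterior serving as the new prior; the shrinking posterior sd then feeds directly into the probability-of-failure calculation for the spalling analysis.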
In this paper, we compare methods of analysing the instability of an underground wedge that is characterised by uncertain parameters, and draw conclusions regarding the ability of the methods to propagate uncertainty through the problem. Assessment of underground rock wedge instability is a common problem in tunnelling and mining operations, one that, particularly during the early stages of a project, may need to be performed when little or no objectively measured information is available. Such lack of information represents epistemic uncertainty, and although it is often analysed using an aleatory (e.g. probabilistic) model, this is known to be inappropriate. Here, we use the vertex method – an extension of interval mathematics – to show how epistemic uncertainty can be propagated through the analysis. We also discuss the implications of using the vertex method as a means of efficiently allocating resources to additional investigation and testing.
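The vertex method can be sketched in a few lines: each uncertain parameter is given an interval, the response function is evaluated at every corner of the resulting hypercube, and the extremes of those evaluations bound the response (exactly so when the function is monotonic in each parameter). The factor-of-safety expression and all numbers below are hypothetical illustrations, not the wedge model analysed in the paper.

```python
from itertools import product

def vertex_method(func, intervals):
    """Propagate interval (epistemic) uncertainty through func by
    evaluating it at every corner of the parameter hypercube."""
    values = [func(*corner) for corner in product(*intervals)]
    return min(values), max(values)

# Hypothetical wedge factor-of-safety model (kN), monotonic in each input:
def factor_of_safety(cohesive_force, frictional_force, driving_force):
    return (cohesive_force + frictional_force) / driving_force

# Illustrative parameter intervals
fos_lo, fos_hi = vertex_method(
    factor_of_safety, [(50.0, 80.0), (30.0, 60.0), (100.0, 140.0)]
)
```

The cost grows as 2^n in the number of uncertain parameters, but for typical wedge analyses n is small; the width of the resulting interval also indicates which parameters most deserve additional investigation and testing.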